This project demonstrates how to use Snap Cloud (powered by Supabase) with Spectacles to build connected AR experiences. Snap Cloud is Snap's managed cloud platform providing database storage, real-time synchronization, cloud storage, and serverless edge functions.
The template includes five comprehensive examples:
- Example 1 - Auth & Tables: Database authentication and CRUD operations
- Example 2 - RealTime: Bidirectional data sync between devices and web
- Example 3 - Storage: Dynamic asset loading (3D models, images, audio)
- Example 4 - Edge Functions: Serverless cloud function execution
- Example 5 - Media Suite: Complete media capture, upload, and streaming
Designing Lenses for Spectacles offers all-new possibilities for rethinking how users interact with digital spaces and the physical world. Get started with our Design Guidelines.
- Lens Studio: v5.12.0+
- Spectacles OS Version: v5.64+
- Spectacles App iOS: v0.64+
- Spectacles App Android: v0.64+
- Snap Cloud Access: Account must be whitelisted
- Internet Connection: Required for all cloud operations
To update your Spectacles device and mobile app, refer to this guide.
You can download the latest version of Lens Studio from here.
To obtain the project folder, you need to clone the repository.
IMPORTANT: This project uses Git Large File Storage (LFS). Downloading a zip file using the green button on GitHub will not work. You must clone the project with a version of Git that has LFS installed. You can download Git LFS here: https://git-lfs.github.com/.
Open Lens Studio and install the following from the Asset Library:
- Supabase Plugin - Access via Window > Supabase
- SupabaseClient (v0.0.10+) - For database and cloud operations
- Spectacles UI Kit - For button interactions
- Open Supabase Plugin:
Window > Supabase - Login with your Lens Studio credentials
- Click "Create a New Project"
- Click "Import Credentials" to generate a SupabaseProject asset
In the Preview Panel, set Device Type Override to Spectacles
Use the Supabase Plugin dashboard to create the required tables. Sample SQL and CSV data are provided in ExternalServicesExamples/example-1to4-mockup-data/.
Snap Cloud/
├── Assets/
│ └── Examples/
│ ├── Example1-AuthAndTables/ # Authentication & database operations
│ │ ├── BasicAuth.ts # Simple auth example
│ │ └── TableConnector.ts # Full CRUD operations
│ ├── Example2-RealTime/ # Real-time synchronization
│ │ └── RealtimeCursor.ts # Bidirectional cursor sync
│ ├── Example3-Storage/ # Cloud storage operations
│ │ └── StorageLoader.ts # Dynamic asset loading
│ ├── Example4-EdgeFunctions/ # Serverless functions
│ │ └── EdgeFunctionImgProcessing.ts
│ ├── Example5-Media/ # Media capture & streaming
│ │ └── Scripts/
│ │ ├── ImageCaptureUploader.ts
│ │ ├── VideoCaptureUploader.ts
│ │ ├── VideoStreamingController.ts
│ │ ├── AudioCaptureUploader.ts
│ │ ├── AudioStreamingController.ts
│ │ ├── CompositeCaptureUploader.ts
│ │ ├── CompositeStreamingController.ts
│ │ ├── CaptureUtilities.ts
│ │ └── UISectionManager.ts
│ └── SnapCloudRequirements.ts # Centralized configuration
├── ExternalServicesExamples/
│ ├── example-1to4-mockup-data/ # Sample data for examples 1-4
│ ├── media-example-web-viewers/ # Web viewers for streaming
│ ├── media-example-server-composite-stitcher/ # Video stitching server
│ └── realtime-example-web-cursor-controller/ # Web cursor controller
└── README.md
Basic authentication and database CRUD operations with automatic connection testing.
BasicAuth.ts - Minimal authentication setup:
async signInUser() {
const { data, error } = await this.client.auth.signInWithIdToken({
provider: 'snapchat',
token: '',
});
if (data && data.user) {
this.uid = JSON.stringify(data.user.id).replace(/^"(.*)"$/, '$1');
print('User ID: ' + this.uid);
}
}
TableConnector.ts - Full database operations with UI:
// Insert data
async insertData(tableName: string, data: object) {
const { data: result, error } = await this.client
.from(tableName)
.insert(data)
.select();
return { result, error };
}
// Query data
async getData(tableName: string, limit: number = 10) {
const { data, error } = await this.client
.from(tableName)
.select('*')
.order('id', { ascending: false })
.limit(limit);
return { data, error };
}
Setup:
- Assign SnapCloudRequirements component with SupabaseProject
- Create test_table in database (see Database Setup section)
- Optionally assign RectangleButton for manual data retrieval
Bidirectional cursor synchronization between Spectacles and web browsers using WebSocket channels.
RealtimeCursor.ts - Two operation modes:
// BROADCAST MODE: Send cursor position to web
private broadcastPosition() {
const pos = this.cursorObject.getTransform().getLocalPosition();
const webX = (pos.x / this.coordinateScale) * this.perspectiveScale;
const webY = (pos.y / this.coordinateScale) * this.perspectiveScale;
this.channel.send({
type: 'broadcast',
event: 'cursor_move',
payload: { x: webX, y: webY, source: 'spectacles' }
});
}
// FOLLOW MODE: Receive cursor position from web
private handleCursorMove(payload: any) {
if (payload.source === 'web') {
this.targetPosition = new vec3(
payload.x * this.movementScale,
payload.y * this.movementScale + this.heightOffset,
this.cursorZPosition
);
}
}
Setup:
- Assign SnapCloudRequirements component
- Set channel name for synchronization
- Assign cursor SceneObject to track/move
- Use web controller from ExternalServicesExamples/realtime-example-web-cursor-controller/
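The broadcast and follow snippets above assume a channel has already been joined. Assuming SupabaseClient exposes the supabase-js v2 Realtime API, the wiring looks roughly like this sketch (the helper name is illustrative; the channel name comes from the Inspector):

```typescript
type CursorPayload = { x: number; y: number; source: string };

// Sketch: join a broadcast channel and route 'cursor_move' events to a handler.
// 'cursor_move' matches the event name used by both the Lens and the web controller.
function joinCursorChannel(
  client: { channel: (name: string) => any },
  channelName: string,
  onMove: (p: CursorPayload) => void
) {
  const channel = client.channel(channelName);
  channel
    .on('broadcast', { event: 'cursor_move' }, (message: { payload: CursorPayload }) => {
      // message.payload carries { x, y, source } as sent by the other side
      onMove(message.payload);
    })
    .subscribe();
  return channel;
}
```

In follow mode, `onMove` would be the `handleCursorMove` method shown above; in broadcast mode the returned channel is what `broadcastPosition` sends on.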
Load 3D models, images, and audio files from Snap Cloud storage on demand.
StorageLoader.ts - Multi-asset loading:
// Load 3D model
private async loadGltfModel() {
const url = this.getStorageUrl(this.gltfPath);
const resource = await this.internetModule.createResourceFromUrl(url);
const gltfAsset = await this.remoteMediaModule.loadGltfFromResource(resource);
const sceneObject = gltfAsset.tryInstantiate(this.modelParent);
sceneObject.getTransform().setLocalScale(new vec3(this.modelScale, this.modelScale, this.modelScale));
}
// Load image texture
private async loadImageTexture() {
const url = this.getStorageUrl(this.imagePath);
const resource = await this.internetModule.createResourceFromUrl(url);
const texture = await this.remoteMediaModule.loadTextureFromResource(resource);
this.outputImage.mainPass.baseTex = texture;
}
// Load audio
private async loadAudioFile() {
const url = this.getStorageUrl(this.audioPath);
const resource = await this.internetModule.createResourceFromUrl(url);
const audioAsset = await this.remoteMediaModule.loadAudioFromResource(resource);
this.audioComponent.audioTrack = audioAsset;
this.audioComponent.play(1);
}
Setup:
- Create storage bucket in Snap Cloud
- Upload test assets from ExternalServicesExamples/example-1to4-mockup-data/testAssets-ADD TO STORAGE BUCKET/
- Configure bucket name and file paths in Inspector
- Assign output components (Image, AudioComponent, parent SceneObject)
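`getStorageUrl` is used by all three loaders above but not shown. Assuming the bucket is public, it can follow Supabase's standard public-object URL layout; this sketch is illustrative and the project's actual implementation may differ:

```typescript
// Sketch: build a public-object URL in the standard Supabase layout:
//   <projectUrl>/storage/v1/object/public/<bucket>/<path>
// Assumes a public bucket; private buckets need signed URLs instead.
function getStorageUrl(projectUrl: string, bucket: string, filePath: string): string {
  const base = projectUrl.replace(/\/+$/, ''); // tolerate a trailing slash
  return `${base}/storage/v1/object/public/${bucket}/${filePath}`;
}
```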
Execute serverless functions for image processing and external API calls.
EdgeFunctionImgProcessing.ts - Call edge functions:
async callEdgeFunction() {
const functionUrl = `${this.supabaseProject.url}/functions/v1/${this.functionName}`;
const response = await this.internetModule.fetch(functionUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${this.supabaseProject.publicToken}`,
},
body: JSON.stringify({
imageUrl: this.inputImageUrl,
operations: ['grayscale', 'blur']
}),
});
const result = await response.json();
// Load processed image from result.outputUrl
}
Setup:
- Deploy edge function from ExternalServicesExamples/example-1to4-mockup-data/testEdgeFunction-ADD TO EDGE FUNCTION CODE/
- Assign SnapCloudRequirements and function name
- Configure input image URL and output Image component
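The request above can be factored into a small builder that mirrors the URL pattern and headers used in the snippet (`/functions/v1/<name>` is Supabase's edge-function path convention; the helper name is hypothetical):

```typescript
// Sketch: assemble the edge-function URL and fetch options used above.
// The Authorization header carries the project's public (anon) token.
function buildEdgeFunctionRequest(
  projectUrl: string,
  functionName: string,
  publicToken: string,
  body: object
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: `${projectUrl}/functions/v1/${functionName}`,
    init: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${publicToken}`,
      },
      body: JSON.stringify(body),
    },
  };
}
```

The result plugs straight into `this.internetModule.fetch(url, init)`.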
Complete media capture, upload, and streaming capabilities for Spectacles.
ImageCaptureUploader.ts - Capture and upload images:
// Capture from camera or composite texture
private async captureImage() {
const texture = this.useCompositeTexture ? this.compositeTexture : this.cameraTexture;
Base64.encodeTextureAsync(
texture,
async (base64String) => {
const binaryData = Base64.decode(base64String);
await this.uploadToStorage(binaryData, `images/${this.sessionId}/capture.jpg`);
},
() => { print('Encoding failed'); },
CompressionQuality.HighQuality,
EncodingType.Jpg
);
}
Features:
- Camera-only or composite (AR content + background) capture
- Configurable JPEG quality
- High-quality still capture via Camera Module API
- Preview display
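`uploadToStorage` appears throughout Example 5 but is not listed. Assuming SupabaseClient exposes the supabase-js Storage API, a sketch could look like this (the bucket name matches the setup section; the return shape and content-type inference are illustrative):

```typescript
// Sketch: upload binary data to the 'specs-bucket' storage bucket via the
// supabase-js Storage API. Returns { ok, message } rather than throwing so
// callers can log failures without crashing the capture loop.
async function uploadToStorage(
  client: {
    storage: {
      from: (bucket: string) => {
        upload: (path: string, data: Uint8Array, opts: object) => Promise<{ error: Error | null }>;
      };
    };
  },
  filePath: string,
  binaryData: Uint8Array
): Promise<{ ok: boolean; message?: string }> {
  // Infer the content type from the extension (only the two used here).
  const contentType = filePath.endsWith('.wav') ? 'audio/wav' : 'image/jpeg';
  const { error } = await client.storage
    .from('specs-bucket')
    .upload(filePath, binaryData, { contentType, upsert: true });
  if (error) {
    return { ok: false, message: error.message };
  }
  return { ok: true };
}
```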
VideoCaptureUploader.ts - Record frame sequences:
// Capture frames during recording
private captureFrame() {
Base64.encodeTextureAsync(
this.cameraTexture,
(base64String) => {
this.frameBuffer.push({
frameNumber: this.frameCount++,
data: base64String,
timestamp: Date.now()
});
},
() => {},
this.useHighQuality ? CompressionQuality.HighQuality : CompressionQuality.LowQuality,
EncodingType.Jpg
);
}
// Upload all frames after recording
private async uploadFrames() {
for (const frame of this.frameBuffer) {
const binaryData = Base64.decode(frame.data);
await this.uploadToStorage(binaryData, `video/${this.sessionId}/frame_${frame.frameNumber}.jpg`);
}
}
VideoStreamingController.ts - Live streaming via Realtime:
// Stream frames to viewers
private streamFrame() {
Base64.encodeTextureAsync(
this.cameraTexture,
(base64String) => {
this.channel.send({
type: 'broadcast',
event: 'video-frame',
payload: {
frame: base64String,
timestamp: Date.now(),
frameNumber: this.frameCount++
}
});
},
() => {},
CompressionQuality.LowQuality,
EncodingType.Jpg
);
}
Features:
- Configurable FPS and quality
- Camera-only or composite mode
- Upload for later processing OR live streaming
- Web viewer available at ExternalServicesExamples/media-example-web-viewers/video-stream-viewer.html
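A note on pacing: the capture scripts schedule frames with DelayedCallbackEvent, whose `reset()` takes seconds, while the interval math is done in milliseconds, which is why the composite recorder divides by 1000. Two small helpers (names are illustrative) make the conversion explicit:

```typescript
// Sketch: convert a target FPS into the two intervals the capture code needs.
// frameIntervalMs feeds the millisecond bookkeeping; resetDelaySeconds feeds
// DelayedCallbackEvent.reset(), which expects seconds.
function frameIntervalMs(targetFps: number): number {
  if (targetFps <= 0) {
    throw new Error('targetFps must be positive');
  }
  return 1000 / targetFps;
}

function resetDelaySeconds(targetFps: number): number {
  return frameIntervalMs(targetFps) / 1000;
}
```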
AudioCaptureUploader.ts - Record audio chunks:
// Process audio frames
private processAudioFrame() {
const shape = this.audioComponent.audioFrame.shape;
const audioData = new Float32Array(shape.x * shape.y);
this.audioComponent.audioFrame.getData(audioData);
this.audioBuffer.push({
audioFrame: audioData,
timestamp: Date.now()
});
}
// Convert to WAV and upload
private async uploadAudioChunk(chunkNumber: number, samples: Float32Array) {
const wavData = this.createWavFile(samples, this.sampleRate);
await this.uploadToStorage(wavData, `audio/${this.sessionId}/chunk_${chunkNumber}.wav`);
}
AudioStreamingController.ts - Live audio streaming:
// Stream audio chunks via Realtime
private streamAudioChunk() {
const audioData = this.getAudioBuffer();
const base64Audio = Base64.encode(new Uint8Array(audioData.buffer));
this.channel.send({
type: 'broadcast',
event: 'audio-chunk',
payload: {
audio: base64Audio,
sampleRate: this.sampleRate,
timestamp: Date.now()
}
});
}
Features:
- Configurable sample rate (8kHz - 48kHz)
- WAV format upload
- Chunk-based processing
- Web listener at ExternalServicesExamples/media-example-web-viewers/audio-stream-listener.html
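The `createWavFile` helper called in `uploadAudioChunk` wraps raw Float32 samples in a WAV container. The sketch below shows a standard minimal layout (44-byte RIFF header plus 16-bit PCM mono data); it illustrates the format rather than the project's exact implementation:

```typescript
// Sketch: wrap Float32 samples in a minimal 16-bit PCM mono WAV container.
// All multi-byte header fields are little-endian, per the RIFF/WAVE spec.
function createWavFile(samples: Float32Array, sampleRate: number): Uint8Array {
  const numChannels = 1;
  const bytesPerSample = 2; // 16-bit PCM
  const dataSize = samples.length * bytesPerSample;
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);

  const writeString = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };

  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);  // RIFF chunk size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);            // fmt sub-chunk size
  view.setUint16(20, 1, true);             // audio format: PCM
  view.setUint16(22, numChannels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * numChannels * bytesPerSample, true); // byte rate
  view.setUint16(32, numChannels * bytesPerSample, true);              // block align
  view.setUint16(34, 8 * bytesPerSample, true);                        // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);

  // Clamp each float to [-1, 1] and convert to signed 16-bit.
  let offset = 44;
  for (let i = 0; i < samples.length; i++) {
    const s = Math.max(-1, Math.min(1, samples[i]));
    view.setInt16(offset, s < 0 ? s * 0x8000 : s * 0x7fff, true);
    offset += 2;
  }
  return new Uint8Array(buffer);
}
```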
⚠️ Disclaimer: The video stitching example uses Railway as an optional hosting platform. Railway is not endorsed by or affiliated with Snap Inc. This is provided as an example deployment pattern. You may use any Node.js hosting service (Heroku, Render, AWS, Google Cloud, etc.) that supports FFmpeg.
CompositeCaptureUploader.ts - Synchronized video + audio capture:
// Start synchronized recording
private async startRecording() {
this.sessionId = SessionUtility.generateSessionId('composite');
this.recordingStartTime = Date.now();
// Start audio recording
this.audioCapture.startRecording();
// Start frame capture timer
this.frameTimerEvent = this.createEvent('DelayedCallbackEvent');
this.frameTimerEvent.bind(() => {
this.captureVideoFrame();
if (this.isRecording) {
this.frameTimerEvent.reset(this.frameInterval / 1000);
}
});
// Create session metadata for stitching
await this.createSessionMetadata();
}
// Trigger server-side stitching
private async triggerStitching() {
const functionUrl = `${this.supabaseProject.url}/functions/v1/trigger-composite-stitch`;
await this.internetModule.fetch(functionUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${this.supabaseProject.publicToken}`,
},
body: JSON.stringify({
sessionId: this.sessionId,
frameRate: this.frameRate,
sampleRate: this.sampleRate
}),
});
}
Features:
- Synchronized video frames + audio chunks with shared session ID
- Metadata files for stitching relationship
- Server-side video stitching via Edge Function or external server
- Stitching server available at ExternalServicesExamples/media-example-server-composite-stitcher/
Share stitched videos to social media platforms via optional third-party integration.
⚠️ Disclaimer: The social sharing example uses Ayrshare as an optional third-party service. Ayrshare is not endorsed by or affiliated with Snap Inc. This is provided as an example of how to integrate with social media APIs. You may use any similar service or build your own integration.
CompositeCaptureUploader.ts - Social sharing controls:
// Inspector-configurable sharing options
@input public shareToSpotlight: boolean = false;
@input public captionInput: TextInputField;
@input public defaultCaption: string = "Captured with Spectacles ✨";
@input public useVerticalCrop: boolean = false; // 9:16 aspect ratio for Spotlight/Reels
// Trigger stitching with sharing options
private async triggerStitching() {
const spotlightCaption = this.captionInput?.text?.trim() || this.defaultCaption;
await this.internetModule.fetch(functionUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${this.supabaseProject.publicToken}`,
},
body: JSON.stringify({
sessionId: this.sessionId,
frameRate: actualFrameRate,
sampleRate: this.sampleRate,
// Video format options
useVerticalCrop: this.useVerticalCrop, // 9:16 crop for Spotlight/Reels
// Social sharing options
shareToSpotlight: this.shareToSpotlight,
spotlightCaption: spotlightCaption,
}),
});
}
Server-side sharing (in composite stitcher):
// Post to Snapchat Spotlight via Ayrshare
async function shareToSocialMedia(videoUrl, caption) {
const response = await axios.post(
'https://api.ayrshare.com/api/post',
{
post: caption,
mediaUrls: [videoUrl],
platforms: ['snapchat'], // Snapchat Spotlight
snapChatOptions: { spotlight: true }
},
{
headers: { 'Authorization': `Bearer ${AYRSHARE_API_KEY}` }
}
);
return response.data;
}
Features:
- Optional sharing toggle (Inspector checkbox or UI switch)
- Custom caption input field
- 9:16 vertical crop for Spotlight/Reels format
- Server-side posting after video stitching completes
- Extensible to other platforms (Instagram, TikTok, YouTube Shorts)
Setup:
- Create account at Ayrshare (or similar service)
- Connect your Snapchat Creator account
- Add AYRSHARE_API_KEY environment variable to your stitching server
- Enable sharing in Inspector and provide caption
- Assign SnapCloudRequirements component
- Create storage bucket named specs-bucket
- For composite mode: assign CameraService and compositeTexture
- For audio: assign AudioTrackAsset (microphone)
- Deploy web viewers for streaming (optional)
- Deploy stitching server for composite video (optional)
CREATE TABLE test_table (
id BIGSERIAL PRIMARY KEY,
message TEXT NOT NULL,
sender TEXT,
timestamp TIMESTAMPTZ DEFAULT NOW(),
lens_session_id TEXT
);
CREATE TABLE cursor_debug (
id BIGSERIAL PRIMARY KEY,
created_at TIMESTAMPTZ DEFAULT NOW(),
user_id TEXT NOT NULL,
x FLOAT NOT NULL,
y FLOAT NOT NULL,
timestamp TIMESTAMPTZ NOT NULL,
channel_name TEXT NOT NULL
);
CREATE TABLE user_interactions (
id BIGSERIAL PRIMARY KEY,
action TEXT NOT NULL,
data JSONB,
timestamp TIMESTAMPTZ DEFAULT NOW(),
session_id TEXT
);
For development, you can disable RLS:
ALTER TABLE test_table DISABLE ROW LEVEL SECURITY;
ALTER TABLE cursor_debug DISABLE ROW LEVEL SECURITY;
ALTER TABLE user_interactions DISABLE ROW LEVEL SECURITY;
- Open any example scene
- Ensure Device Type is set to Spectacles
- Verify SnapCloudRequirements has SupabaseProject assigned
- Run and check console for authentication logs
- Deploy lens to Spectacles
- Ensure internet connectivity
- Test each example functionality
All examples require:
- SnapCloudRequirements Script - Centralized Supabase configuration
- SupabaseProject Asset - Created via Supabase Plugin
- Device Type: Spectacles - Set in preview panel
Authentication is automatic using signInWithIdToken({ provider: 'snapchat', token: '' }).
- Use anon public key from Snap Cloud dashboard
- Disable RLS for quick testing
- Enable Row Level Security (RLS)
- Create proper authentication policies
- Use service role key only server-side
- Validate all user inputs
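When re-enabling RLS for production, each table needs explicit policies or all client queries will be rejected. As an illustrative starting point (policy names and rules here are examples, not the project's exact policies):

```sql
-- Example only: enable RLS on test_table and allow authenticated users
-- to read and insert rows. Tighten the USING / WITH CHECK clauses to
-- match your own access model.
ALTER TABLE test_table ENABLE ROW LEVEL SECURITY;

CREATE POLICY "authenticated_read" ON test_table
  FOR SELECT TO authenticated USING (true);

CREATE POLICY "authenticated_insert" ON test_table
  FOR INSERT TO authenticated WITH CHECK (true);
```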
If you have questions, connect with us on Reddit.
For Snap Cloud questions, visit Snap Cloud Documentation.
Feel free to contribute improvements via pull requests.
Built by the Spectacles team
